# 10 Some special matrices and terminology.

Some matrices that look a certain way have special names; we will learn some of them here.

## Identity matrix.

The identity matrix is a **square matrix** where the **diagonal** entries are all $1$'s, and the rest are $0$, so something that looks like this: $$ I_{n}=\begin{bmatrix} 1 & \\ & 1 \\ & & \ddots \\ & & & 1 \end{bmatrix} $$A square matrix is one with an equal number of rows and columns. We often denote the $n\times n$ identity matrix by $I_{n}$, or sometimes $I_{n\times n}$. Notice that the omitted entries above are all $0$. And when we speak of the **diagonal**, it always runs from the top-left to the bottom-right. Again, this is our convention of choice (one that will be consistent with other parts of math later). The other "diagonal" is not the diagonal, and does not have a standard name.

So $I_{2}=\begin{bmatrix}1 & 0\\0 & 1\end{bmatrix}$ and $I_{3}=\begin{bmatrix}1 & 0 & 0\\0 & 1 & 0\\ 0 & 0 & 1\end{bmatrix}$, and so on. If we are to describe the entries of an $n\times n$ identity matrix $I_{n}$, we have the following: $$ (I_{n})_{ij} = \begin{cases} 1 & \text{if \(i=j\)} \\ 0 & \text{if \(i\neq j\)} \end{cases} \ \ , $$ for all $1 \le i,j \le n$.

Why is it called the identity matrix? This is because whenever the matrix product $AI_{n}$ or $I_{n}A$ is defined, we just get $A$ back!

> **Proposition.** Identity matrix.
> Let $A$ be an $n\times k$ matrix. Then $I_{n}A = A$ and $AI_{k}=A$.
> Note well the sizes of the identity matrices here.

Note, $A$ does not have to be square, but we need the correctly sized identity matrices for this to work.

$\blacktriangleright$ Proof. Let $A$ be any $n\times k$ matrix; we wish to show $I_{n}A = A$. To show two matrices are the same, one way is to show that they have exactly the same entries at the same positions. In particular, we show the $(i,j)$-th entry of $I_{n}A$ is the same as the $(i,j)$-th entry of $A$.
Using the matrix product entry formula (see [[1 teaching/smc-spring-2024-math-13/linear-algebra-notes/09-entrywise-formulas-for-matrix-operations|notes 09]]), we can calculate the $(i,j)$-th entry of $I_{n}A$, which is $$ (I_{n}A)_{ij}=\sum_{t=1}^{n}(I_{n})_{it}(A)_{tj}. $$But $I_{n}$ is the identity matrix, so the entry $(I_{n})_{it}$ is zero unless $t=i$. So the sum reduces to a single term, namely $$ (I_{n}A)_{ij}=(I_{n})_{ii}(A)_{ij}=1\cdot(A)_{ij}=(A)_{ij}. $$ This shows the $(i,j)$-th entry of $I_{n}A$ is the $(i,j)$-th entry of $A$. Hence $I_{n}A=A$! The other statement, that $AI_{k}=A$, I will leave to you as an exercise. $\blacksquare$

**Remark.** When the context is clear, we may sometimes just write $I$ for the identity matrix without specifying its size. But know that this is generally ambiguous.

## Zero matrix.

If a matrix has entries full of zeros, then it is a zero matrix. We typically write $O_{n\times k}$ for the $n\times k$ matrix where every entry is zero. If it is a square $n\times n$ zero matrix, we may also write $O_{n}=O_{n\times n}$. Saying that the entries of $O_{n\times k}$ are all zeros means that the $(i,j)$-th entry of $O_{n\times k}$ is $$ (O_{n\times k})_{ij}=0 $$for all $1\le i \le n$, $1 \le j \le k$. One can surmise some obvious properties of the zero matrix, and indeed we have:

> **Proposition.** Zero matrix.
> (1) If $A$ is of size $n\times k$, then $O_{n\times k} + A = A$.
> (2) If $A$ is of size $n\times k$, then $O_{q\times n} A=O_{q\times k}$ and $A O_{k\times p} = O_{n\times p}$.

The proof of these statements is also left to you as an exercise. A quick hint: as above, one way to show two matrices are the same is to show that they have the same $(i,j)$-th entry.

**Remark.** When the context is clear, we may sometimes just write $O$ or even $0$ for the zero matrix without specifying its size. But this is again generally ambiguous. And $0$ may also be confused with the scalar $0$, so keep that in mind.
Nevertheless, sometimes we do write $0_{n\times k}$ for the $n\times k$ zero matrix.

## Diagonal matrices.

A matrix is a **diagonal matrix** if it is a **square matrix**, meaning $n\times n$, and every entry that is not on the diagonal is zero. The entries on the diagonal can themselves be zero; it is just that the off-diagonal entries must all be zero. In other words, if $A$ is an $n\times n$ diagonal matrix, then $$ (A)_{ij}=0 \quad \text{whenever \(i\neq j\).} $$Again, the diagonal entries are the entries where the row index equals the column index. Since everything that is not on the diagonal is zero, it is enough to record just the diagonal entries of a diagonal matrix. So we have the notation $$ A= \text{diag}(\lambda_{1}, \lambda_{2},\ldots,\lambda_{n}) = \begin{bmatrix} \lambda_{1} & & & 0\\ & \lambda_{2} \\ & & \ddots \\ 0 & & & \lambda_{n} \end{bmatrix} $$ for the $n\times n$ diagonal matrix $A$ whose diagonal entries are $\lambda_{1}, \lambda_{2},\ldots ,\lambda_{n}$. Note well that the identity matrices and the square zero matrices are all diagonal matrices. These matrices will be very special to us later, so keep this in mind.

## Symmetric matrices.

A matrix $A$ is **symmetric** if and only if $A=A^{T}$. That is, after you take the transpose, you get a matrix that looks exactly the same as the original. Notice that if a matrix is symmetric, then it must be a square matrix of some size $n\times n$. We can also describe symmetric matrices by their entries. If $A$ is a symmetric matrix, then $$ (A)_{ij}=(A)_{ji} $$for all $i,j$. That is to say, the $(i,j)$-th entry of $A$ is the same as the $(j,i)$-th entry of $A$, which you recall is the $(i,j)$-th entry of $A^{T}$ (see [[1 teaching/smc-spring-2024-math-13/linear-algebra-notes/09-entrywise-formulas-for-matrix-operations|notes 09]]). For example, identity matrices and square zero matrices are all symmetric matrices. Later these will also be very special to us, so keep them in mind.
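If you like to experiment, here is a small numerical sanity check of the propositions above (a sketch using NumPy, which is not part of these notes but is handy for playing with matrices):

```python
import numpy as np

# The 3x3 identity matrix I_3 and a 3x4 matrix A.
I3 = np.eye(3)
A = np.arange(12).reshape(3, 4)

# Proposition: I_n A = A and A I_k = A.
# Note the sizes: I_3 multiplies on the left, I_4 on the right.
assert np.array_equal(I3 @ A, A)
assert np.array_equal(A @ np.eye(4), A)

# The zero matrix proposition: O + A = A, and O A is a zero matrix.
O = np.zeros((3, 4))
assert np.array_equal(O + A, A)
assert np.array_equal(np.zeros((2, 3)) @ A, np.zeros((2, 4)))

# A diagonal matrix diag(5, -2, 0): the off-diagonal entries are zero,
# and a diagonal entry is allowed to be zero too.
D = np.diag([5, -2, 0])

# Every diagonal matrix is symmetric: D equals its transpose.
assert np.array_equal(D, D.T)
```

Each `assert` here passes silently, confirming the corresponding statement on these small examples (of course, a numerical check is not a proof!).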
## Triangular matrices.

Triangular matrices are also square matrices, but the entries strictly above the diagonal are all zero, or the entries strictly below the diagonal are all zero -- forming a triangular shape. The former is called a **lower triangular matrix**, the latter an **upper triangular matrix**.

In particular, a **lower triangular matrix** $L$ of size $n\times n$ looks like this: $$ L=\begin{bmatrix} \ell_{11} & & & & 0\\ \ell_{21} & \ell_{22} \\ \ell_{31} & \ell_{32} & \ell_{33} \\ \vdots & & & \ddots\\ \ell_{n1} & \ell_{n2} & \ell_{n3} & \cdots & \ell_{nn} \end{bmatrix}. $$ Again, the omitted entries are zero. The entries of a lower triangular matrix $L$ have the property that $$ (L)_{ij} = 0\quad\text{if \(i < j\).} $$Note, the other entries $(L)_{ij}$, for $i \ge j$, **can also be zero**; it is just that the entries strictly above the diagonal must be zero for a lower triangular matrix.

Similarly, an **upper triangular matrix** $R$ of size $n\times n$ looks like this: $$ R=\begin{bmatrix} r_{11} & r_{12} & r_{13} & \cdots & r_{1n} \\ & r_{22} & r_{23} & \cdots & r_{2n} \\ & & r_{33} & \cdots & r_{3n} \\ & & & \ddots & \vdots \\ 0 & & & & r_{nn} \end{bmatrix} $$The omitted entries are again zero, and the entries of an upper triangular matrix $R$ have the property that $$ (R)_{ij}=0\quad\text{if \(i > j\)}, $$ while the other entries $(R)_{ij}$, for $i \le j$, **can also be zero**.

We say a matrix is a **triangular matrix** if it is either a lower triangular matrix or an upper triangular matrix. As simple examples, the identity matrices and the square zero matrices are triangular matrices; in fact, they are both lower and upper triangular. We will encounter more matrices with special names later on, so this is just the beginning of our matrix friends!
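To make the entrywise conditions concrete, here is one more NumPy sketch (again, an optional aside, not part of the notes themselves). NumPy's `np.tril` and `np.triu` zero out the entries strictly above and strictly below the diagonal, respectively:

```python
import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# np.tril keeps entries on or below the diagonal; np.triu keeps on or above.
L = np.tril(M)   # lower triangular: (L)_ij = 0 whenever i < j
R = np.triu(M)   # upper triangular: (R)_ij = 0 whenever i > j

# Check the defining entrywise conditions.
# (NumPy indices start at 0, unlike the 1-indexed convention of the notes.)
n = M.shape[0]
assert all(L[i, j] == 0 for i in range(n) for j in range(n) if i < j)
assert all(R[i, j] == 0 for i in range(n) for j in range(n) if i > j)

# The identity matrix is both lower and upper triangular.
I = np.eye(3)
assert np.array_equal(np.tril(I), I) and np.array_equal(np.triu(I), I)
```

Notice also that `L + R` counts the diagonal of `M` twice, so `L + R - np.diag(np.diag(M))` recovers `M`, mirroring how a square matrix splits into its lower part, upper part, and diagonal.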